disparate impact theory
Trump signs education-focused executive orders on AI, school discipline, accreditation, foreign gifts and more
Former Education Secretary Bill Bennett discusses the Supreme Court case that will evaluate parents' rights to opt out of classes where LGBTQ books are used in the curriculum on 'The Story.'

President Donald Trump signed multiple executive orders relating to education Wednesday afternoon, several tied to the theme of restoring meritocracy to the education system. The orders, seven in total, included actions to integrate artificial intelligence into K-12 school curricula, reforms to school discipline and accreditation guidelines, requirements for the disclosure of foreign funding to schools and enhancements to the country's workforce development programs.

Trump's slate of education-focused orders also included a directive demanding an end to DEI ideology in schools, specifically the use of "disparate impact theory," building on his January executive order ending DEI-style programming and ideology in K-12 schools. The president also signed an executive order on Wednesday establishing a White House initiative to support the efficiency and effectiveness of Historically Black Colleges and Universities.

President Donald Trump holds an executive order relating to education in the Oval Office of the White House, Wednesday, April 23, 2025, in Washington, as Commerce Secretary Howard Lutnick, Labor Secretary Lori Chavez-DeRemer and Education Secretary Linda McMahon watch.
- North America > United States > New Jersey (0.05)
- North America > United States > Mississippi (0.05)
- Government > Regional Government > North America Government > United States Government (1.00)
- Education (1.00)
Addressing Algorithmic Discrimination
It should no longer be a surprise that algorithms can discriminate. A criminal risk-assessment algorithm is far more likely to erroneously predict that a Black defendant will commit a crime in the future than a white defendant [2]. Ad-targeting algorithms promote job opportunities to race- and gender-skewed audiences, showing secretary and supermarket job ads to far more women than men [1]. A hospital's resource-allocation algorithm favored white patients over Black patients with the same level of medical need [5]. Algorithmic discrimination is particularly troubling when it affects consequential social decisions, such as who gets released from jail or who has access to a loan or health care. Employment is a prime example: employers increasingly rely on algorithmic tools to recruit, screen, and select job applicants by predicting which candidates will be good employees.
- North America > United States > New York > Erie County > Buffalo (0.05)
- North America > United States > Missouri > St. Louis County > St. Louis (0.05)
- North America > United States > California (0.05)
- Law (1.00)
- Health & Medicine (1.00)
- Education > Assessment & Standards (0.49)
- Education > Educational Setting (0.47)
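Disparate impact in hiring screens like those described above is conventionally measured with the "four-fifths rule": compare each group's selection rate to the highest group's rate and flag ratios below 0.8. The sketch below is a minimal, hypothetical illustration of that arithmetic; all group names and counts are invented, not drawn from the cases cited.

```python
# Hypothetical sketch of the "four-fifths rule" screen for disparate impact.
# All group names and counts below are invented for illustration.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants who were selected."""
    return selected / applicants

def disparate_impact_ratios(group_rates: dict[str, float]) -> dict[str, float]:
    """Each group's selection rate divided by the highest group's rate.

    Ratios below 0.8 are conventionally flagged for further review.
    """
    top = max(group_rates.values())
    return {group: rate / top for group, rate in group_rates.items()}

# Invented example: 48/100 of group A selected vs. 30/100 of group B.
rates = {
    "group_a": selection_rate(48, 100),  # 0.48
    "group_b": selection_rate(30, 100),  # 0.30
}
ratios = disparate_impact_ratios(rates)
flagged = {group for group, ratio in ratios.items() if ratio < 0.8}
```

Here group B's ratio is 0.30 / 0.48 ≈ 0.63, below the 0.8 threshold, so group B would be flagged. Passing this screen does not establish that a selection tool is fair, and failing it does not by itself establish liability; it is a first-pass statistical indicator.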
Algorithms and bias: What lenders need to know (White & Case LLP, International Law Firm, Global Law Practice)
Much of the software now revolutionizing the financial services industry depends on algorithms that apply artificial intelligence (AI)--and increasingly, machine learning--to automate everything from simple, rote tasks to activities requiring sophisticated judgment. These algorithms, and the analyses that undergird them, have become progressively more sophisticated as the pool of potentially meaningful variables within the Big Data universe continues to proliferate. When properly implemented, algorithmic and AI systems increase processing speed, reduce mistakes due to human error and minimize labor costs, all while improving customer satisfaction rates. Credit-scoring algorithms, for example, not only help financial institutions optimize default and prepayment rates, but also streamline the application process, allowing for leaner staffing and an enhanced customer experience. When effective, these algorithms enable lenders to tweak approval criteria quickly and continually, responding in real time to both market conditions and customer needs. Both lenders and borrowers stand to benefit. For decades, financial services companies have used different types of algorithms to trade securities, predict financial markets, identify prospective employees and assess potential customers.
- Europe > United Kingdom (0.28)
- North America > United States > Texas (0.05)
- Oceania > Australia (0.04)
- Europe > Finland > Uusimaa > Helsinki (0.04)
- Law (1.00)
- Government > Regional Government > North America Government > United States Government (1.00)
- Banking & Finance > Financial Services (1.00)
- Information Technology > Artificial Intelligence > Machine Learning (1.00)
- Information Technology > Data Science > Data Mining > Big Data (0.35)
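The passage above describes credit-scoring models whose approval criteria lenders can retune "quickly and continually." A minimal sketch of that idea, under invented assumptions, is a hand-set linear score with an adjustable cutoff; real credit models are fitted from data and far richer, and every weight and field name below is illustrative, not an actual scoring formula.

```python
# Toy sketch of a threshold-based credit approval rule: a linear score plus
# an adjustable cutoff a lender could move as conditions change.
# All weights and features are invented for illustration.

from dataclasses import dataclass

@dataclass
class Applicant:
    income: float        # annual income in dollars
    debt_ratio: float    # debt-to-income ratio, 0.0 to 1.0
    late_payments: int   # count of late payments in the last 24 months

def score(a: Applicant) -> float:
    """Hand-set linear score in [0, 1]; real models are fitted, not hand-set."""
    income_term = 0.4 * min(a.income / 100_000, 1.0)
    debt_term = 0.4 * (1.0 - a.debt_ratio)
    history_term = 0.2 * max(0.0, 1.0 - 0.25 * a.late_payments)
    return income_term + debt_term + history_term

def approve(a: Applicant, cutoff: float = 0.6) -> bool:
    """The cutoff is the lever a lender can retune quickly and continually."""
    return score(a) >= cutoff
```

Raising or lowering `cutoff` changes approval rates immediately, which is exactly why such levers also carry the fair-lending risks the article goes on to discuss: a cutoff that looks neutral can still produce group-skewed approval rates.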
Algorithms and bias: What lenders need to know (JD Supra)
Although AI-driven algorithms seek to avoid the failures of the rigid instructions-based models of the past--such as those linked to the 1987 "Black Monday" stock market crash or 2010's "Flash Crash"--these models continue to present potential financial, reputational and legal risks for financial services companies.
- Europe > United Kingdom (0.28)
- North America > United States > Texas (0.05)
- Oceania > Australia (0.04)
- Europe > Finland > Uusimaa > Helsinki (0.04)
- Information Technology > Artificial Intelligence > Machine Learning (1.00)
- Information Technology > Data Science > Data Mining > Big Data (0.35)